14. Fake and Real Losses
Binary Cross Entropy Loss
We've mostly used plain cross entropy loss in this program, which is a negative log loss applied to the output of a softmax layer. For a binary classification problem, such as classifying image data as real or fake, we can calculate the binary cross entropy loss as:
-[y \log(\hat{y}) + (1 - y)\log(1 - \hat{y})]
In other words, a sum of two log losses: when the true label y = 1, only the first term contributes, and when y = 0, only the second does.
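To make that concrete, here is a minimal sketch (the prediction and label values are made up for illustration) that computes the formula by hand and checks it against PyTorch's built-in binary cross entropy:

```python
import torch
import torch.nn.functional as F

y_hat = torch.tensor([0.9, 0.2])  # predicted probabilities (sigmoid outputs)
y = torch.tensor([1.0, 0.0])      # true labels: 1 = real, 0 = fake

# binary cross entropy, term by term, averaged over the batch
manual = -(y * torch.log(y_hat) + (1 - y) * torch.log(1 - y_hat)).mean()

# PyTorch's built-in version (also averages over the batch by default)
builtin = F.binary_cross_entropy(y_hat, y)

print(manual.item(), builtin.item())  # the two values match
```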
You can read more in the PyTorch documentation for `BCELoss`.
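In the real-versus-fake setting of this section, the same loss is typically applied twice: once with targets of 1 for real images and once with targets of 0 for fakes. The sketch below shows one common way to set this up; the helper names `real_loss` and `fake_loss`, and the choice of `BCEWithLogitsLoss` (which folds the sigmoid into the loss so the model can output raw scores), are illustrative rather than prescribed by this lesson.

```python
import torch
import torch.nn as nn

# BCEWithLogitsLoss = sigmoid + binary cross entropy in one numerically
# stable step, so the model can output raw, unbounded scores (logits)
criterion = nn.BCEWithLogitsLoss()

def real_loss(D_out):
    """Loss for a batch the model should classify as real (targets of 1)."""
    labels = torch.ones(D_out.size(0))
    return criterion(D_out.squeeze(), labels)

def fake_loss(D_out):
    """Loss for a batch the model should classify as fake (targets of 0)."""
    labels = torch.zeros(D_out.size(0))
    return criterion(D_out.squeeze(), labels)

# e.g. D_out = discriminator(images), with shape (batch_size, 1)
```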